Reservoir regularization stabilizes learning of Echo State Networks with output feedback

Authors

  • René Felix Reinhart
  • Jochen J. Steil
Abstract

Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning can lead to error amplification in these settings and therefore regularization is important for both generalization and reduction of error amplification. We show that regularization of the inner reservoir network mitigates parameter dependencies and boosts the task-specific performance.
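The setting described above — a reservoir driven by its own output feedback, with a regularized linear read-out — can be sketched as follows. This is a minimal illustration of a standard ESN with Tikhonov (ridge) regularization of the read-out, trained under teacher forcing; all sizes, signals, and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_out, T = 100, 1, 500

# Random reservoir, rescaled so the spectral radius is below 1.
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_fb = rng.uniform(-1, 1, size=(n_res, n_out))  # output feedback weights

# Teacher signal the network should autonomously reproduce.
y = np.sin(0.1 * np.arange(T)).reshape(T, n_out)

# Collect reservoir states under teacher forcing: the *target* output
# is fed back instead of the network's own (untrained) output.
X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(1, T):
    x = np.tanh(W @ x + W_fb @ y[t - 1])
    X[t] = x

# Ridge-regularized read-out: W_out = (X^T X + alpha I)^{-1} X^T y.
# The penalty alpha shrinks the read-out weights, which damps error
# amplification once the loop is closed through the feedback.
alpha = 1e-4
W_out = np.linalg.solve(X.T @ X + alpha * np.eye(n_res), X.T @ y)

train_mse = np.mean((X @ W_out - y) ** 2)
```

Increasing `alpha` trades training accuracy for smaller read-out weights and a more stable closed feedback loop.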


Related articles

Regularization and stability in reservoir networks with output feedback

Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning affects the output feedback loop and can lead to error amplification. Regularization is therefore important for both generalization and the reduction of error amplification. We show that regularization of the reservoir and the read-out layer reduces the risk of error amplificat...


Balancing of neural contributions for multi-modal hidden state association

We generalize the formulation of associative reservoir computing networks to multiple input modalities and demonstrate applications in image and audio processing scenarios. Robust association with reservoir networks requires coping with potential error amplification of output feedback dynamics and handling differently sized input and output modalities. We propose a dendritic neuron model in c...


Optimization and applications of echo state networks with leaky-integrator neurons

Standard echo state networks (ESNs) are built from simple additive units with a sigmoid activation function. Here we investigate ESNs whose reservoir units are leaky integrator units. Units of this type have individual state dynamics, which can be exploited in various ways to accommodate the network to the temporal characteristics of a learning task. We present stability conditions, introduce a...
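The leaky-integrator unit mentioned above replaces the plain additive update with a low-pass-filtered one, so each unit's state evolves on its own timescale. A minimal sketch of this update rule, with an assumed leak rate and illustrative network sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, n_in = 50, 1

W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

def step(x, u, a=0.3):
    # Leaky-integrator update: each unit keeps a fraction (1 - a) of its
    # previous state and mixes in a fraction a of the fresh activation.
    # x(t+1) = (1 - a) x(t) + a * tanh(W x(t) + W_in u(t))
    return (1 - a) * x + a * np.tanh(W @ x + W_in @ u)

x = np.zeros(n_res)
for t in range(100):
    x = step(x, np.array([np.sin(0.1 * t)]))
```

With `a = 1` the update reduces to the standard additive ESN unit; smaller `a` slows the state dynamics, matching the network to slower input timescales.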


Studies on reservoir initialization and dynamics shaping in echo state networks

The fixed random connectivity of networks in reservoir computing leads to significant variation in performance. Only a few problem-specific optimization procedures are known to date. We study a general initialization method using permutation matrices and derive a new unsupervised learning rule based on intrinsic plasticity (IP) for echo state networks. Using three different benchmarks, we show th...
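The permutation-matrix initialization mentioned above can be sketched directly: a permutation matrix is orthogonal, so all of its eigenvalues lie on the unit circle and scaling by a factor rho sets the spectral radius exactly, unlike a random Gaussian reservoir where the radius must be estimated and rescaled. The sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_res, rho = 64, 0.95

# A random permutation matrix: the identity with its rows shuffled.
P = np.eye(n_res)[rng.permutation(n_res)]

# Scaling an orthogonal matrix by rho gives a reservoir whose spectral
# radius is exactly rho, with no eigenvalue estimation needed.
W = rho * P

spectral_radius = np.max(np.abs(np.linalg.eigvals(W)))
```

Each unit then receives input from exactly one other unit, giving a sparse, cycle-structured reservoir with precisely controlled dynamics.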


Improving the Prediction Accuracy of Echo State Neural Networks by Anti-Oja's Learning

Echo state neural networks, which are a special case of recurrent neural networks, are studied from the viewpoint of their learning ability, with the goal of improving their prediction ability. A standard training of these neural networks uses the pseudoinverse matrix for one-step learning of the weights from hidden to output neurons. This regular adaptation of Echo State neural networks was optimi...
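The one-step pseudoinverse training mentioned above solves the read-out least-squares problem in closed form: W_out = pinv(X) Y minimizes ||X W - Y|| over the collected reservoir states X. A minimal sketch (the anti-Oja modification itself is not shown; state matrix and targets below are synthetic stand-ins):

```python
import numpy as np

rng = np.random.default_rng(3)
T, n_res, n_out = 200, 30, 1

# Stand-in for collected reservoir states (rows = time steps).
X = rng.normal(size=(T, n_res))
W_true = rng.normal(size=(n_res, n_out))
Y = X @ W_true  # noiseless targets, so the exact solution is recoverable

# One-step learning: Moore-Penrose pseudoinverse gives the minimum-norm
# least-squares solution in a single linear-algebra call.
W_out = np.linalg.pinv(X) @ Y
```

With more time steps than reservoir units and noiseless targets, this recovers the generating weights exactly; in practice a small ridge penalty is often preferred over the raw pseudoinverse to limit error amplification.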




Journal:

Volume   Issue 

Pages  -

Publication date: 2011